Generative models with kernel distance in data space

Authors

Abstract

Generative models that deal with modeling a joint data distribution are generally either autoencoder or GAN based. Both have their pros and cons: the former tends to generate blurry images, while the latter is unstable in training. We propose a new generative model that resembles a classical GAN in transforming Gaussian noise into data space, but without adversarial optimization. Training the proposed model is a two-step procedure. First, we train an autoencoder-based architecture to model the data manifold. Second, we use a Latent Trick to map a Gaussian to the autoencoder's latent space. The resulting Latent Cramer-Wold (LCW) generator achieves competitive scores. Eliminating adversarial training by replacing the discriminator with kernel methods results in a stable training procedure that is not prone to mode collapse. We also show that the introduced Latent Trick can improve the capabilities of other latent-based models. We validate the model on standard benchmarks and compare it to different approaches.
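The key idea above is replacing the GAN discriminator with a kernel distance between generated and real samples. The paper uses the Cramer-Wold distance; as an illustration of the same family of kernel two-sample distances, the sketch below computes a Gaussian-kernel maximum mean discrepancy (MMD). The function names are mine, not the paper's.

```python
import math

def gaussian_kernel(x, y, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2)
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def mmd2(xs, ys, gamma=1.0):
    # Biased estimator of the squared kernel distance (MMD) between
    # a sample of generated points xs and a sample of data points ys.
    kxx = sum(gaussian_kernel(a, b, gamma) for a in xs for b in xs) / len(xs) ** 2
    kyy = sum(gaussian_kernel(a, b, gamma) for a in ys for b in ys) / len(ys) ** 2
    kxy = sum(gaussian_kernel(a, b, gamma) for a in xs for b in ys) / (len(xs) * len(ys))
    return kxx + kyy - 2.0 * kxy
```

Identical samples give a distance of zero, and well-separated samples give a large one. Because such a distance is a closed-form, differentiable function of the samples, it can be minimized directly by gradient descent, which is why no adversarial discriminator (and hence no unstable min-max training) is needed.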


Similar resources

Latent Space of Generative Models

Several recent papers have treated the latent space of deep generative models, e.g., GANs or VAEs, as Riemannian manifolds. The argument is that operations such as interpolation are better done along geodesics that minimize path length not in the latent space but in the output space of the generator. However, this implicitly assumes that some simple metric such as L2 is meaningful in the output...


Probabilistic Distance Measures in Reproducing Kernel Hilbert Space

Probabilistic distance measures are important quantities in many research areas. For example, the Chernoff distance (or the Bhattacharyya distance as its special example) is often used to bound the Bayes error in a pattern classification task and the Kullback-Leibler (KL) distance is a key quantity in the information theory literature. However, computing these distances is a difficult task and ...
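For the univariate Gaussian case, both distances mentioned in this abstract have simple closed forms, which makes the "difficult to compute" point concrete: the difficulty arises for general distributions, not for Gaussians. A sketch with my own function names:

```python
import math

def kl_gaussians(mu1, s1, mu2, s2):
    # KL(N(mu1, s1^2) || N(mu2, s2^2)); note the KL distance is asymmetric.
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def bhattacharyya_gaussians(mu1, s1, mu2, s2):
    # Bhattacharyya distance between two univariate Gaussians; symmetric.
    return (0.25 * (mu1 - mu2)**2 / (s1**2 + s2**2)
            + 0.5 * math.log((s1**2 + s2**2) / (2 * s1 * s2)))
```

Both distances vanish when the two Gaussians coincide, and the KL distance changes value when its arguments are swapped, unlike the Bhattacharyya distance.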


Learning Discriminative Metrics via Generative Models and Kernel Learning

Metrics specifying distances between data points can be learned in a discriminative manner or from generative models. In this paper, we show how to unify generative and discriminative learning of metrics via a kernel learning framework. Specifically, we learn local metrics optimized from parametric generative models. These are then used as base kernels to construct a global kernel...
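One standard way to build a global kernel from base kernels, as this abstract describes, is a nonnegative-weighted sum: a weighted sum of positive semi-definite kernels with nonnegative weights is itself positive semi-definite, so the result is a valid kernel. A minimal sketch with hypothetical names (this is not the paper's specific construction):

```python
import math

def rbf_kernel(x, y, gamma=1.0):
    # Gaussian RBF base kernel.
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def linear_kernel(x, y):
    # Linear base kernel (plain dot product).
    return sum(a * b for a, b in zip(x, y))

def combine_kernels(base_kernels, weights):
    # A nonnegative-weighted sum of PSD base kernels is itself PSD,
    # so the returned function is a valid global kernel.
    def global_kernel(x, y):
        return sum(w * k(x, y) for w, k in zip(weights, base_kernels))
    return global_kernel
```

Usage: `k = combine_kernels([rbf_kernel, linear_kernel], [0.7, 0.3])` yields a single kernel function `k(x, y)` that blends the two base similarities; in kernel-learning frameworks the weights themselves are typically optimized from data.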


Kernel generative topographic mapping

A kernel version of Generative Topographic Mapping, a model of the manifold learning family, is defined in this paper. Its ability to adequately model non-i.i.d. data is illustrated in a problem concerning the identification of protein subfamilies from protein sequences.



Journal

Journal title: Neurocomputing

Year: 2022

ISSN: 0925-2312, 1872-8286

DOI: https://doi.org/10.1016/j.neucom.2022.02.053